Results 1 - 2 of 2
1.
PLoS One; 17(3): e0263916, 2022.
Article in English | MEDLINE | ID: covidwho-1742004

ABSTRACT

OBJECTIVES: Ground-glass opacity (GGO), a hazy, gray-appearing density on computed tomography (CT) of the lungs, is one of the hallmark features of SARS-CoV-2 infection in COVID-19 patients. This AI-driven study focuses on the segmentation, morphology, and distribution patterns of GGOs. METHOD: We use an AI-driven unsupervised machine learning approach called PointNet++ to detect and quantify GGOs in CT scans of COVID-19 patients and to assess the severity of the disease. We conducted our study on the "MosMedData" dataset, which contains CT lung scans of 1110 patients with or without COVID-19 infection. We quantify the morphologies of GGOs using Minkowski tensors and compute an abnormality score for individual regions of the segmented lung and GGOs. RESULTS: PointNet++ detects GGOs with the highest evaluation accuracy (98%), average class accuracy (95%), and intersection over union (92%) using only a fraction of the 3D data. On average, the shapes of GGOs in the COVID-19 datasets deviate from sphericity by 15%, and anisotropies in GGOs are dominated by dipole and hexapole components. These anisotropies may help to quantitatively delineate GGOs of COVID-19 from those of other lung diseases. CONCLUSION: PointNet++ and the Minkowski tensor-based morphological approach, together with abnormality analysis, will provide radiologists and clinicians with a valuable set of tools for interpreting CT lung scans of COVID-19 patients. Implementation would be particularly useful in countries severely devastated by COVID-19, such as India, where the number of cases has outstripped available resources, creating delays or even breakdowns in patient care. This AI-driven approach synthesizes both the unique GGO distribution pattern and the severity of the disease to allow for more efficient diagnosis, triaging, and conservation of limited resources.
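The shape analysis in the RESULTS section can be illustrated with a short sketch. The Python example below computes a sphericity value and rotation-invariant spherical-harmonic multipole amplitudes of the surface-normal distribution for one segmented GGO mask, in the spirit of the Minkowski-tensor analysis the abstract describes. The function name, the skimage/scipy toolchain, and the normalization are illustrative assumptions, not the authors' released code.

```python
import numpy as np
from scipy.special import sph_harm  # scipy's (m, l, azimuth, polar) convention
from skimage import measure


def ggo_shape_metrics(mask, spacing=(1.0, 1.0, 1.0), l_max=6):
    """Sphericity and multipole anisotropy amplitudes for one GGO mask.

    mask: 3D boolean array containing a single segmented GGO region.
    Returns (sphericity, {l: q_l}); every q_l vanishes for a perfect sphere.
    """
    # Triangulate the region surface from the binary mask.
    verts, faces, _, _ = measure.marching_cubes(
        mask.astype(float), level=0.5, spacing=spacing)
    area = measure.mesh_surface_area(verts, faces)
    volume = mask.sum() * np.prod(spacing)  # voxel-count volume

    # Sphericity is 1 for a ball; "deviation from sphericity" = 1 - sphericity.
    sphericity = (np.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / area

    # Per-facet areas and unit normals (area-weighted normal density).
    tri = verts[faces]
    cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    f_area = 0.5 * np.linalg.norm(cross, axis=1)
    n = cross / (np.linalg.norm(cross, axis=1, keepdims=True) + 1e-12)

    # Spherical angles of the facet normals.
    azim = np.arctan2(n[:, 1], n[:, 0]) % (2 * np.pi)
    polar = np.arccos(np.clip(n[:, 2], -1.0, 1.0))

    # Rotation-invariant multipole amplitudes of the normal distribution,
    # in the style of Minkowski structure metrics: q_1 is the dipole term,
    # and higher l capture higher-order anisotropy.
    q = {}
    for l in range(1, l_max + 1):
        s = sum(abs(np.sum(f_area * sph_harm(m, l, azim, polar))) ** 2
                for m in range(-l, l + 1))
        q[l] = np.sqrt(4 * np.pi / (2 * l + 1) * s) / f_area.sum()
    return sphericity, q
```

For a perfect sphere the normal distribution is uniform, so sphericity is 1 and all q_l with l >= 1 vanish; a 15% sphericity deviation together with large low-order amplitudes is the kind of signature the abstract reports.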


Subject(s)
COVID-19/diagnostic imaging, Lung/pathology, Radiographic Image Interpretation, Computer-Assisted/methods, Artificial Intelligence, COVID-19/pathology, Female, Humans, India, Lung/diagnostic imaging, Male, Patient Acuity, Retrospective Studies, Tomography, X-Ray Computed/methods, Unsupervised Machine Learning
2.
Comput Biol Med; 143: 105298, 2022 Feb 20.
Article in English | MEDLINE | ID: covidwho-1693721

ABSTRACT

The COVID-19 (coronavirus disease 2019) pandemic had affected more than 186 million people, with over 4 million deaths worldwide, by June 2021, a toll that has strained global healthcare systems. Chest computed tomography (CT) scans have a potential role in the diagnosis and prognostication of COVID-19. A diagnostic system that is cost-efficient and convenient to operate on resource-constrained devices such as mobile phones would enhance the clinical use of chest CT scans and provide swift, mobile, and accessible diagnostic capabilities. This work proposes a novel Android application that detects COVID-19 infection from chest CT scans using a highly efficient and accurate deep learning algorithm. It further generates an attention heatmap, overlaid on the segmented lung parenchyma region of the chest CT scan, that shows the regions of infection in the lungs; the heatmap algorithm was developed as part of this work and verified by radiologists. We propose a novel selection approach combined with multi-threading for faster generation of heatmaps on a mobile device, which reduces processing time by about 93%. The neural network trained in this work to detect COVID-19 achieves an F1 score and accuracy of 99.58% each and a sensitivity of 99.69%, which is better than most reported results for COVID-19 diagnosis from CT scans. This work will be beneficial in high-volume practices and will help doctors quickly and efficiently triage patients for the early diagnosis of COVID-19.
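The reported speed-up rests on two ideas that generalize beyond Android: compute heatmaps only for informative slices (the "selection approach") and parallelize the per-slice work. Below is a minimal Python sketch of that pattern; `compute_heatmap` is a gradient-magnitude stand-in for the paper's actual attention algorithm, and the 2% lung-fraction threshold is an assumed parameter, since the abstract does not give the exact selection rule.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np


def compute_heatmap(ct_slice):
    # Stand-in saliency: normalized gradient magnitude. The real app runs
    # a CAM-style pass through its COVID-19 detection network here.
    gy, gx = np.gradient(ct_slice.astype(float))
    g = np.hypot(gx, gy)
    return g / (g.max() + 1e-8)


def select_slices(lung_mask, min_lung_fraction=0.02):
    # Selection step: keep only slice indices whose segmented lung region
    # covers at least `min_lung_fraction` of the slice (assumed threshold).
    frac = lung_mask.reshape(lung_mask.shape[0], -1).mean(axis=1)
    return [i for i, f in enumerate(frac) if f >= min_lung_fraction]


def heatmaps_parallel(volume, lung_mask, workers=4):
    """volume, lung_mask: (slices, H, W) CT volume and boolean lung mask."""
    idx = select_slices(lung_mask)
    # Fan the per-slice work out to a thread pool; on-device, the same
    # pattern maps onto an Android executor, with native inference kernels
    # doing the work that NumPy does here.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        maps = pool.map(compute_heatmap, (volume[i] for i in idx))
    # Mask each heatmap to the lung parenchyma before overlaying on the CT.
    return {i: m * lung_mask[i] for i, m in zip(idx, maps)}
```

Skipping slices with little or no lung parenchyma removes work that cannot contribute to the overlay, which is where a selection rule of this kind buys most of its time savings; the thread pool then hides the latency of the remaining per-slice passes.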
